Efficiently Learning from Revealed Preference
In this paper, we consider the revealed preferences problem from a learning
perspective. Every day, a price vector and a budget are drawn from an unknown
distribution, and a rational agent buys his most preferred bundle according to
some unknown utility function, subject to the given prices and budget
constraint. We wish not only to find a utility function which rationalizes a
finite set of observations, but to produce a hypothesis valuation function
which accurately predicts the behavior of the agent in the future. We give
efficient algorithms with polynomial sample-complexity for agents with linear
valuation functions, as well as for agents with linearly separable, concave
valuation functions with bounded second derivative.
Comment: Extended abstract appears in WINE 201
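For the linear-valuation case the abstract describes, the learning problem has a simple structure: a budget-constrained agent with linear utility spends the whole budget on the good with the best value-per-dollar ratio, so each observation constrains the ratios of any rationalizing valuation. A minimal sketch, assuming this setup (the function names and the numerical tolerance are my own, not the paper's):

```python
import numpy as np

def best_bundle_linear(v, p, b):
    """An agent with linear utility v.x and budget b facing prices p
    spends the whole budget on the good with the highest
    value-per-dollar ratio v_i / p_i."""
    i = int(np.argmax(v / p))
    x = np.zeros_like(v, dtype=float)
    x[i] = b / p[i]
    return x

def consistent_with_observation(v_hat, p, b, x_obs):
    """Check whether a hypothesis valuation v_hat rationalizes one
    observed purchase: the chosen bundle must be (weakly) optimal, and
    the best affordable utility under a linear v_hat is b * max(v_hat / p)."""
    return v_hat @ x_obs >= b * np.max(v_hat / p) - 1e-9

# Example: with valuation (2, 1) and unit prices, only good 0 is bought.
v, p, b = np.array([2.0, 1.0]), np.array([1.0, 1.0]), 10.0
x = best_bundle_linear(v, p, b)
```

Accumulating such consistency constraints over many observed days and solving the resulting feasibility problem is one natural route to a hypothesis valuation that predicts future behavior.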
Revealed cardinal preference
I prove that as long as we allow the marginal utility for money (lambda) to
vary between purchases (similarly to the budget), then the quasi-linear and
the ordinal budget-constrained models rationalize the same data. However, we know that lambda is approximately constant. I provide a simple constructive proof of the necessary and sufficient condition for constant-lambda rationalization, which I argue should replace the Generalized Axiom of
Revealed Preference in empirical studies of consumer behavior.
'Go Cardinals!'
It is the minimal requirement of any scientific theory that it be consistent with
the data it is trying to explain. In the case of (Hicksian) consumer theory it was
revealed preference, introduced by Samuelson (1938, 1948), that provided an
empirical test to satisfy this need. At that time most economic reasoning was
done in terms of a competitive general equilibrium, a concept abstract enough
that it could be built on ordinal preferences over baskets of goods, even if only
the extremely specialized ones of Arrow and Debreu. Starting in the sixties,
however, economics moved beyond the 'invisible hand' explanation of how
markets, even competitive ones, operate. A seemingly unavoidable step of this
'revolution' was that ever since, most economic research has been carried out
in a partial equilibrium context. The partial equilibrium approach does
not mean that the rest of the markets are ignored, but rather that they are held
constant. In other words, there is a special commodity, call it money, that
reflects the trade-offs of moving purchasing power across markets. As a result,
the basic building block of consumer behavior in partial equilibrium is no longer
the consumer's preferences over goods but rather her valuation of them, in terms
of money. This new paradigm necessitates a new theory of revealed preference.
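The Generalized Axiom of Revealed Preference that the author proposes to replace can itself be tested mechanically on price/bundle data. A sketch of the standard GARP test (Varian-style transitive-closure check; the implementation details are mine, not the paper's):

```python
import numpy as np

def satisfies_garp(prices, bundles):
    """Standard test of the Generalized Axiom of Revealed Preference.
    x_t is directly revealed preferred to x_s when p_t.x_t >= p_t.x_s.
    GARP fails if the transitive closure of this relation ranks x_t over
    x_s while x_t was strictly cheaper than the bundle actually bought
    at prices p_s (i.e. p_s.x_s > p_s.x_t)."""
    P, X = np.asarray(prices, float), np.asarray(bundles, float)
    T = len(X)
    cost = P @ X.T                          # cost[t, s] = p_t . x_s
    diag = cost[np.arange(T), np.arange(T)]
    R = diag[:, None] >= cost               # direct revealed preference
    for k in range(T):                      # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    for t in range(T):
        for s in range(T):
            if R[t, s] and cost[s, s] > cost[s, t]:
                return False                # strict cycle: GARP violated
    return True
```

A data set passes exactly when the revealed-preference relation contains no cycle with a strict step, which is what "rationalizable by some utility function" amounts to by Afriat's theorem.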
Correlation functions, Bell's inequalities and the fundamental conservation laws
I derive the correlation function for a general theory of two-valued spin
variables that satisfy the fundamental conservation law of angular momentum.
The unique theory-independent correlation function is identical to the quantum
mechanical correlation function. I prove that any theory of correlations of
such discrete variables satisfying the fundamental conservation law of angular
momentum violates Bell's inequalities. Taken together with Bell's
theorem, this result has far-reaching implications. No theory satisfying
Einstein locality, reality in the EPR-Bell sense, and the validity of the
conservation law can be constructed. Therefore, all local hidden variable
theories are incompatible with fundamental symmetries and conservation laws.
Bell's inequalities can be obeyed only by violating a conservation law. The
implications for experiments on Bell's inequalities are obvious. The result
provides new insight regarding entanglement and its measures.
Comment: LaTeX, 12pt, 11 pages, 2 figures
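The quantum mechanical correlation function the abstract refers to is, for a spin singlet, E(a, b) = -cos(a - b), and plugging it into the CHSH combination exhibits the claimed violation of Bell's inequalities. A quick numerical illustration (angles chosen for the maximal violation; this is not the paper's derivation):

```python
import math

def E(a, b):
    """Quantum mechanical correlation of two-valued spin measurements
    along directions a and b (angles in radians) for the singlet state."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination; local hidden variable theories require |S| <= 2."""
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

# Measurement angles giving the maximal quantum violation, S = 2*sqrt(2) > 2.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
```

Any correlation function equal to the quantum one therefore cannot be reproduced by a local hidden variable model, which is the tension with conservation laws that the abstract draws out.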
Testing Consumer Rationality using Perfect Graphs and Oriented Discs
Given a consumer data-set, the axioms of revealed preference proffer a binary
test for rational behaviour. A natural (non-binary) measure of the degree of
rationality exhibited by the consumer is the minimum number of data points
whose removal induces a rationalisable data-set. We study the computational
complexity of the resultant consumer rationality problem in this paper. This
problem is, in the worst case, equivalent (in terms of approximation) to the
directed feedback vertex set problem. Our main result is to obtain an exact
threshold on the number of commodities that separates easy cases and hard
cases. Specifically, for two-commodity markets the consumer rationality problem
is polynomial time solvable; we prove this via a reduction to the vertex cover
problem on perfect graphs. For three-commodity markets, however, the problem is
NP-complete; we prove this using a reduction from planar 3-SAT that is based
upon oriented-disc drawings.
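The non-binary rationality measure described above, the minimum number of observations to delete so that the remainder is rationalizable, can be computed by brute force on small data sets. A sketch under that framing (exponential in the number of observations; the names are mine, and the paper's point is that the general problem is as hard as directed feedback vertex set):

```python
from itertools import combinations
import numpy as np

def satisfies_garp(P, X):
    """GARP test: no revealed-preference cycle with a strict step."""
    P, X = np.asarray(P, float), np.asarray(X, float)
    T = len(X)
    cost = P @ X.T                          # cost[t, s] = p_t . x_s
    diag = cost[np.arange(T), np.arange(T)]
    R = diag[:, None] >= cost               # direct revealed preference
    for k in range(T):                      # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    return not any(R[t, s] and cost[s, s] > cost[s, t]
                   for t in range(T) for s in range(T))

def rationality_index(P, X):
    """Minimum number of observations whose removal leaves a
    rationalisable (GARP-consistent) data set, by exhaustive search."""
    P, X = np.asarray(P, float), np.asarray(X, float)
    T = len(X)
    for k in range(T + 1):
        for keep in combinations(range(T), T - k):
            if satisfies_garp(P[list(keep)], X[list(keep)]):
                return k
```

On a GARP-consistent data set the index is 0; each revealed-preference cycle forces at least one deletion, mirroring the feedback-vertex-set structure the paper exploits.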
Testing Bell's inequality with two-level atoms via population spectroscopy
We propose a feasible experimental scheme, employing methods of population
spectroscopy with two-level atoms, for a test of Bell's inequality for massive
particles. The correlation function measured in this scheme is the joint atomic
function. An inequality imposed by local realism is violated by any
entangled state of a pair of atoms.
Comment: 4 pages, REVTeX, no figures. More info on
http://www.ligo.caltech.edu/~cbrif/science.htm
Diminishing returns and tradeoffs constrain the laboratory optimization of an enzyme
Optimization processes, such as evolution, are constrained by diminishing returns (the closer the optimum, the smaller the benefit per mutation) and by tradeoffs (improvement of one property at the cost of others). However, the magnitude and molecular basis of these parameters, and their effect on evolutionary transitions, remain unknown. Here we pursue a complete functional transition of an enzyme, with a >10^9-fold change in the enzyme's selectivity, using laboratory evolution. We observed strong diminishing returns, with the initial mutations conferring >25-fold higher improvements than later ones, and asymmetric tradeoffs whereby the gain/loss ratio of the new/old activity decreased 400-fold from the beginning of the trajectory to its end. We describe the molecular basis for these phenomena and suggest they have an important role in shaping natural proteins. These findings also suggest that the catalytic efficiency and specificity of many natural enzymes may be far from their optimum.
Constructive updating/downdating of oblique projectors: a generalization of the Gram-Schmidt process
A generalization of the Gram-Schmidt procedure is achieved by providing
equations for updating and downdating oblique projectors. The work is motivated
by the problem of adaptive signal representation outside the orthogonal basis
setting. The proposed techniques are shown to be relevant to the problem of
discriminating signals produced by different phenomena when the order of the
signal model needs to be adjusted.
Comment: As it will appear in Journal of Physics A: Mathematical and
Theoretical (2007
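An oblique projector, the object whose updating and downdating the paper derives, projects onto one subspace along a direction determined by another, rather than orthogonally. A minimal sketch of constructing one and checking idempotence (the construction P = V(W^H V)^{-1} W^H is standard; the helper name and example subspaces are mine):

```python
import numpy as np

def oblique_projector(V, W):
    """Oblique projector P = V (W^H V)^{-1} W^H: range(P) = range(V),
    and P annihilates every vector orthogonal to range(W). Idempotent
    (P P = P) but, unlike an orthogonal projector, not Hermitian in general."""
    return V @ np.linalg.solve(W.conj().T @ V, W.conj().T)

V = np.array([[1.0], [1.0], [0.0]])   # subspace projected onto
W = np.array([[1.0], [0.0], [0.0]])   # analysis (measurement) subspace
P = oblique_projector(V, W)
```

Updating P when a column is appended to V, and downdating when one is removed, without recomputing the inverse from scratch is exactly the kind of recursion that generalizes the Gram-Schmidt process beyond the orthogonal setting.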
The Combinatorial World (of Auctions) According to GARP
Revealed preference techniques are used to test whether a data set is
compatible with rational behaviour. They are also incorporated as constraints
in mechanism design to encourage truthful behaviour in applications such as
combinatorial auctions. In the auction setting, we present an efficient
combinatorial algorithm to find a virtual valuation function with the optimal
(additive) rationality guarantee. Moreover, we show that there exists such a
valuation function that both is individually rational and is minimum (that is,
it is component-wise dominated by any other individually rational, virtual
valuation function that approximately fits the data). Similarly, given upper
bound constraints on the valuation function, we show how to fit the maximum
virtual valuation function with the optimal additive rationality guarantee. In
practice, revealed preference bidding constraints are very demanding. We
explain how approximate rationality can be used to create relaxed revealed
preference constraints in an auction. We then show how combinatorial methods
can be used to implement these relaxed constraints. Worst/best-case welfare
guarantees that result from the use of such mechanisms can be quantified via
the minimum/maximum virtual valuation function.
Nonlocality, Bell's Ansatz and Probability
Quantum Mechanics lacks an intuitive interpretation, which is the cause of a
generally formalistic approach to its use. This in turn has led to a certain
insensitivity to the actual meaning of many words used in its description and
interpretation. Herein, we analyze carefully the possible mathematical meanings
of those terms used in analysis of EPR's contention, that Quantum Mechanics is
incomplete, as well as Bell's work descendant therefrom. As a result, many
inconsistencies and errors in contemporary discussions of nonlocality, as well
as in Bell's Ansatz with respect to the laws of probability, are identified.
Evading these errors precludes serious conflicts between Quantum Mechanics and
both Special Relativity and Philosophy.
Comment: 8&1/2 pages revtex; v2: many corrections, clarifications &
extensions, all small; v3: editorial scru